# Chinese NLU tasks
## Erlangshen DeBERTa V2 710M Chinese

A 710M-parameter DeBERTa-v2 model focused on Chinese natural language understanding (NLU) tasks. It is pre-trained with whole-word masking, which masks entire Chinese words rather than individual characters, and provides a strong backbone for Chinese NLP.

- License: Apache-2.0
- Tags: Large Language Model, Transformers, Chinese

- Publisher: IDEA-CCNL
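
As an illustration only, a minimal fill-mask sketch with Hugging Face `transformers` might look like the following. The Hub ID `IDEA-CCNL/Erlangshen-DeBERTa-v2-710M-Chinese` and the `use_fast=False` tokenizer setting are assumptions, not details stated in this catalog entry.

```python
# Minimal usage sketch (assumptions: the model is published on the Hugging Face
# Hub as "IDEA-CCNL/Erlangshen-DeBERTa-v2-710M-Chinese" and `transformers` is installed).
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "IDEA-CCNL/Erlangshen-DeBERTa-v2-710M-Chinese"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)  # slow tokenizer assumed; a fast variant may not exist
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Whole-word-masking pre-training: the model fills in masked Chinese words.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("生活的真谛是[MASK]。", top_k=5))  # "The true meaning of life is [MASK]."
```
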
## Erlangshen DeBERTa V2 97M Chinese

A 97M-parameter Chinese DeBERTa-v2 base model specialized in natural language understanding tasks, also pre-trained with whole-word masking.

- License: Apache-2.0
- Tags: Large Language Model, Transformers, Chinese

- Publisher: IDEA-CCNL
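
Because these checkpoints target NLU tasks, a typical next step is to attach a task head and fine-tune. The sketch below is a hedged illustration: the Hub ID `IDEA-CCNL/Erlangshen-DeBERTa-v2-97M-Chinese` and the two-label setup are placeholders, and the classification head is randomly initialized until fine-tuning.

```python
# Hedged sketch: load the 97M checkpoint with a (randomly initialized)
# classification head for a downstream Chinese NLU task.
# Assumptions: Hub ID and label count are placeholders for illustration.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "IDEA-CCNL/Erlangshen-DeBERTa-v2-97M-Chinese"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("这部电影太精彩了", return_tensors="pt")  # "This movie is fantastic."
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2); meaningful only after fine-tuning
print(logits)
```
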